Contents: Meeting the brief | Investigation | Plan & Design | Create | Evaluation | References | Summary

Leaving Certificate 2024 Computer Science Coursework Report

Student Number: 104564
Deadline: Tuesday 19th March 2024

Meeting the brief:

Video showing the artefact in operation

Meeting the basic and advanced requirements of the brief

Each requirement is listed below with a description of how it is met.

Basic 1
Requirement: Create a fully automated embedded system that utilises digital/analogue inputs and digital/analogue outputs to support the theme of wellbeing.
How it is met: The device emits pulsing audio bursts so the user can estimate their distance from their surroundings. These features greatly aid vision-impaired and blind people in gauging their surroundings, helping them build confidence in their movements and independence.

Basic 2
Requirement: Validate and store the data gathered from the embedded system.
How it is met: The user can store settings and audio preferences in the program, as well as tune the level at which the system notifies them.

Basic 3
Requirement: Create an analysis component that can be used to calculate or predict certain information and inform future decisions relating to wellbeing.
How it is met: The laser sensors estimate the shape of the device's surroundings to keep the user safe. The device also calculates the speed of the user's hand and predicts the time to collision, adjusting its level of warning accordingly (a sketch of this logic follows the basic requirements below).
   
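The basic requirements come down to two small pieces of logic: mapping the measured distance to an audible pulse rate, and estimating the hand's speed so a time to collision can be predicted. The sketch below shows one way this could look in Python; read_distance_mm() is a placeholder for the real sensor driver, and the range and pulse-rate constants are assumptions rather than the device's actual values.

```python
# A minimal sketch of the distance-to-audio mapping and time-to-collision
# estimate. Constants and the sensor read function are assumptions.
import time

MAX_RANGE_MM = 2000      # assumed maximum sensor range
FAST_PULSE_S = 0.05      # fastest beep interval when an obstacle is very close
SLOW_PULSE_S = 1.0       # slowest beep interval at maximum range

def read_distance_mm():
    """Placeholder for the real laser-sensor driver (an assumption)."""
    return 800

def pulse_interval(distance_mm):
    """Closer obstacle -> faster pulsing, so distance can be judged by ear."""
    d = max(0, min(distance_mm, MAX_RANGE_MM))
    return FAST_PULSE_S + (SLOW_PULSE_S - FAST_PULSE_S) * d / MAX_RANGE_MM

def time_to_collision(distance_mm, prev_distance_mm, dt_s):
    """Estimate hand speed from two readings and return distance / speed."""
    speed = (prev_distance_mm - distance_mm) / dt_s   # mm/s, positive = approaching
    return distance_mm / speed if speed > 0 else None

if __name__ == "__main__":
    prev, prev_t = read_distance_mm(), time.monotonic()
    for _ in range(3):
        time.sleep(pulse_interval(prev))              # the audio burst would play here
        d, t = read_distance_mm(), time.monotonic()
        print("time to collision (s):", time_to_collision(d, prev, t - prev_t))
        prev, prev_t = d, t
```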
Advanced 1
Requirement: Using Python and/or JavaScript, create a computer model based on your own personally created dataset of wellbeing data or one that you have sourced externally (suggestions included on the next page). Your personal dataset could be generated manually, programmatically or by the embedded system. The dataset should contain multiple descriptive features of wellbeing and the model should be capable of answering a minimum of two 'what if' type questions which you will need to devise yourself.
How it is met: At the press of a button the device saves its telemetry (distance from an obstacle and speed). With this data I can determine a function that gives the user the optimal amount of warning at the right time (see the fitting sketch after these requirements).

Advanced 2
Requirement: Each 'what if' question must use a minimum of three validated parameters (using at least two different data types) and, based on the information provided, offer the user insights in relation to some aspect of their wellbeing.
How it is met: The distance input from the sensor is checked for "overflow" (the sensor reads 0 if the distance is above its maximum), in which case the distance is set to the maximum (Validation 1). The velocity is smoothed to avoid discrete jumps caused by integer precision, and also acts as a rough estimate of the momentum of the hand (Validation 2). From these two values the code calculates the estimated time until impact and plays an extra-loud warning sound if it falls below a certain threshold, but first it checks that both the left and right sides are approaching an obstacle (Validation 3), confirming that the user's hand is not simply rotating. These parameters answer the what-if questions when put into the function found for Advanced requirement 1 (the combined decision logic is sketched after the 'What-if' questions below).

Advanced 3
Requirement: Users can view data in a graphical format which displays information such as their progress using the system or the results of a 'what if' scenario.
How it is met: The embedded system saves the number of collisions and also records the simulated position. The system can then be connected to a display, at which point a program graphs these parameters for the user's family or carer to view (see the plotting sketch that follows).

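For Advanced requirement 3, a short script on the connected display could turn the saved collision log into a graph for the carer or family. The sketch below assumes the device exports a collisions.csv file with an ISO-format timestamp column; the file layout is an assumption.

```python
# A minimal sketch of the carer/family view: collisions per day as a bar chart.
import csv
from collections import Counter
import matplotlib.pyplot as plt

days = []
with open("collisions.csv") as f:        # assumed export from the device
    for row in csv.DictReader(f):
        days.append(row["timestamp"][:10])   # keep just the YYYY-MM-DD part

counts = Counter(days)
dates = sorted(counts)
plt.bar(dates, [counts[d] for d in dates])
plt.xlabel("Date")
plt.ylabel("Collisions logged")
plt.title("Collisions recorded by the device")
plt.tight_layout()
plt.show()
```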
'What-if' questions

Question: What if the device is reading a high speed and the sensors report a low distance to obstacles?
Explanation: If the estimated time until impact (distance ÷ speed) is below 0.2 seconds, the device plays an extra-loud tone.

Question: What if the device registers a low distance (<250 mm) and a low speed (<500 mm/s)?
Explanation: Because the user's movements are below the warning threshold, the device assumes the user is resting their hand and smoothly turns off. Thanks to the placement of the sensors, if an object is held in the user's hand the sensors sit on either side of it, allowing the device to keep operating and detecting obstacles.

Question: What if the device sees a sudden deceleration within a very short distance (<100 mm) of an obstacle?
Explanation: The device saves the date and time of the collision for the user's carer or family to view later.
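A minimal sketch of the decision logic behind these three questions, including the clamping and smoothing validations described under Advanced requirement 2. Only the 0.2 s, 250 mm, 500 mm/s and 100 mm thresholds come from this report; the function names, the 2000 mm range and the smoothing factor are assumptions.

```python
# Sketch of the 'what if' decision logic; see the assumptions noted above.
import datetime

MAX_RANGE_MM = 2000       # assumed sensor maximum range
LOUD_TTC_S = 0.2          # extra-loud warning below this time to impact
REST_DIST_MM = 250        # low distance...
REST_SPEED_MM_S = 500     # ...and low speed means the hand is resting
COLLISION_DIST_MM = 100   # sudden deceleration this close counts as a collision

def clamp_distance(raw_mm):
    """Validation 1: a raw reading of 0 means 'beyond range', so clamp to max."""
    return MAX_RANGE_MM if raw_mm == 0 else min(raw_mm, MAX_RANGE_MM)

def smooth(previous, new, alpha=0.3):
    """Validation 2: exponential smoothing hides integer-precision jumps."""
    return previous + alpha * (new - previous)

def decide(raw_left_mm, raw_right_mm, left_speed, right_speed, sudden_decel, log):
    """left_speed/right_speed are the smoothed approach speeds in mm/s."""
    left_mm = clamp_distance(raw_left_mm)
    right_mm = clamp_distance(raw_right_mm)
    distance = min(left_mm, right_mm)          # nearest obstacle on either side
    speed = max(left_speed, right_speed)       # fastest approach speed

    # Validation 3: both sides must be closing in, otherwise the hand is rotating.
    approaching = left_speed > 0 and right_speed > 0

    if approaching and distance / speed < LOUD_TTC_S:
        return "extra_loud_tone"                     # question 1
    if distance < REST_DIST_MM and speed < REST_SPEED_MM_S:
        return "smooth_power_down"                   # question 2
    if sudden_decel and distance < COLLISION_DIST_MM:
        log.append(datetime.datetime.now())          # question 3: saved for the carer
        return "collision_logged"
    return "normal_pulsing"
```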